NoVaS Transformations: Flexible Inference for Volatility Forecasting
In this paper we contribute several new results on the NoVaS transformation approach for volatility forecasting introduced by Politis (2003a,b, 2007). In particular: (a) we introduce an alternative target distribution (uniform); (b) we present a new method for volatility forecasting using NoVaS; (c) we show that the NoVaS methodology is applicable in situations where (global) stationarity fails, such as the cases of local stationarity and/or structural breaks; (d) we show how to apply the NoVaS ideas in the case of returns with asymmetric distribution; and finally (e) we discuss the application of NoVaS to the problem of estimating value at risk (VaR). The NoVaS methodology allows for a flexible approach to inference and has immediate applications in the context of short time series and series that exhibit local behavior (e.g. breaks, regime switching, etc.). We conduct an extensive simulation study on the predictive ability of the NoVaS approach and find that NoVaS forecasts lead to a much 'tighter' distribution of the forecasting performance measure for all data generating processes. This is especially relevant in the context of volatility predictions for risk management. We further illustrate the use of NoVaS for a number of real datasets and compare the forecasting performance of NoVaS-based volatility forecasts with realized and range-based volatility measures.
Keywords: ARCH, GARCH, local stationarity, structural breaks, VaR, volatility.
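The core of the NoVaS idea is a studentizing transformation of the returns. A minimal sketch of the "simple NoVaS" variant with equal weights over the current and past squared returns (illustrative only; the papers optimize the weights against a normal or uniform target distribution):

```python
import numpy as np

def simple_novas(x, p=10):
    """Simple NoVaS transform with equal weights (a sketch of the
    normalizing idea, not the optimized-weight versions in the papers):
    W_t = X_t / sqrt(mean(X_t^2, X_{t-1}^2, ..., X_{t-p}^2)).
    Because X_t itself enters the denominator, |W_t| <= sqrt(p + 1)."""
    x = np.asarray(x, dtype=float)
    w = np.empty(len(x) - p)
    for t in range(p, len(x)):
        denom = np.sqrt(np.mean(x[t - p:t + 1] ** 2))
        w[t - p] = x[t] / denom
    return w
```

The boundedness of the transformed series is what makes it well behaved even for heavy-tailed returns.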
Algorithmic options for joint time-frequency analysis in structural dynamics applications
The purpose of this paper is to present recent research efforts by the authors supporting the superiority of joint time-frequency analysis over the traditional Fourier transform in the study of non-stationary signals commonly encountered in the fields of earthquake engineering and structural dynamics. In this respect, three distinct signal processing techniques appropriate for the representation of signals in the time-frequency plane are considered. Namely, the harmonic wavelet transform, the adaptive chirplet decomposition, and the empirical mode decomposition are utilized to analyze certain seismic accelerograms and structural response records. Numerical examples associated with the inelastic dynamic response of a seismically excited 3-story benchmark steel-frame building are included to show how the mean instantaneous frequency, as derived by the aforementioned techniques, can be used as an indicator of global structural damage.
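One common route to an instantaneous-frequency estimate, shown here via the analytic signal (Hilbert transform) rather than the wavelet, chirplet, or EMD machinery used in the paper, is to differentiate the unwrapped phase and take an amplitude-weighted average. A sketch using only NumPy:

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (same construction scipy.signal.hilbert
    uses): zero out negative frequencies, double positive ones."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

def mean_instantaneous_frequency(x, fs):
    """Amplitude-weighted mean of the instantaneous frequency in Hz,
    obtained from the derivative of the unwrapped analytic phase."""
    z = analytic_signal(x)
    phase = np.unwrap(np.angle(z))
    inst_freq = np.diff(phase) * fs / (2.0 * np.pi)
    weights = np.abs(z[:-1]) ** 2
    return np.sum(inst_freq * weights) / np.sum(weights)
```

For a structural response record, a drop in this quantity over time is the kind of global softening/damage indicator the paper discusses.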
Recommended from our members
Joint time-frequency representation of simulated earthquake accelerograms via the adaptive chirplet transform
Seismic accelerograms are inherently nonstationary signals, since both the intensity and frequency content of seismic events evolve in time. The adaptive chirplet transform is a signal processing technique for joint time-frequency representation of nonstationary data. Analysis of a signal via the adaptive chirplet decomposition in conjunction with the Wigner-Ville distribution yields the so-called adaptive spectrogram, which constitutes a valid representation of the signal in the time-frequency plane. In this paper the potential of this technique for capturing the temporal evolution of the frequency content of strong ground motions is assessed. In this regard, simulated nonstationary earthquake accelerograms compatible with an exponentially modulated and appropriately filtered Kanai-Tajimi spectrum are processed using the adaptive chirplet transform. These are samples of a random process whose evolutionary power spectrum can be represented by an analytical expression. It is suggested that the average of the ensemble of the adaptive chirplet spectrograms can be construed as an estimate of the underlying evolutionary power spectrum. The obtained numerical results show, indeed, that the estimated evolutionary power spectrum is in good agreement with the one defined analytically. This points out the potential of adaptive chirplet analysis as a tool for capturing the localized frequency content of arbitrary databanks of real seismic accelerograms.
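The Kanai-Tajimi spectrum mentioned above has a standard closed form, and one modulated sample can be generated by the spectral-representation method. A sketch, where the ground-filter parameters (wg, zg, s0) and the envelope shape are illustrative assumptions rather than the paper's actual model:

```python
import numpy as np

def kanai_tajimi_psd(w, s0=1.0, wg=15.0, zg=0.6):
    """One-sided Kanai-Tajimi PSD (standard textbook form); wg, zg, s0
    are placeholder ground-filter parameters."""
    r = (w / wg) ** 2
    return s0 * (1.0 + 4.0 * zg**2 * r) / ((1.0 - r) ** 2 + 4.0 * zg**2 * r)

def simulate_accelerogram(duration=20.0, dt=0.01, wmax=100.0, nfreq=500,
                          tpeak=2.0, seed=0):
    """Spectral-representation sample: a stationary Kanai-Tajimi process
    multiplied by an exponential-type modulating envelope with unit peak
    at t = tpeak. Returns (t, acceleration)."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, dt)
    dw = wmax / nfreq
    w = (np.arange(nfreq) + 0.5) * dw            # offset avoids w = 0
    amp = np.sqrt(2.0 * kanai_tajimi_psd(w) * dw)
    phi = rng.uniform(0.0, 2.0 * np.pi, nfreq)
    stationary = (amp[:, None]
                  * np.cos(w[:, None] * t[None, :] + phi[:, None])).sum(axis=0)
    envelope = (t / tpeak) * np.exp(1.0 - t / tpeak)
    return t, envelope * stationary
```

Averaging the time-frequency spectrograms of an ensemble of such samples is, in spirit, the evolutionary-spectrum estimate the paper investigates.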
Association of Manganese Biomarker Concentrations with Blood Pressure and Kidney Parameters among Healthy Adolescents: NHANES 2013–2018
Deficiency or excess exposure to manganese (Mn), an essential mineral, may have potentially adverse health effects. The kidneys are a major organ of Mn site-specific toxicity because of their unique role in the filtration, metabolism, and excretion of xenobiotics. We hypothesized that Mn concentrations were associated with poorer blood pressure (BP) and kidney parameters such as estimated glomerular filtration rate (eGFR), blood urea nitrogen (BUN), and albumin-creatinine ratio (ACR). We conducted a cross-sectional analysis of 1931 healthy U.S. adolescents aged 12–19 years participating in National Health and Nutrition Examination Survey cycles 2013–2014, 2015–2016, and 2017–2018. Blood and urine Mn concentrations were measured using inductively coupled plasma mass spectrometry. Systolic and diastolic BP were calculated as the average of available readings. eGFR was calculated from serum creatinine using the Bedside Schwartz equation. We performed multiple linear regression, adjusting for age, sex, body mass index, race/ethnicity, and poverty income ratio. We observed null relationships between blood Mn concentrations and eGFR, ACR, BUN, and BP. In a subset of 691 participants, we observed that a 10-fold increase in urine Mn was associated with a 16.4 mL/min higher eGFR (95% Confidence Interval: 11.1, 21.7). These exploratory findings should be interpreted cautiously and warrant investigation in longitudinal studies.
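The Bedside Schwartz equation used here for eGFR has a simple closed form; a one-line sketch (the example values in the test are illustrative, not study data):

```python
def bedside_schwartz_egfr(height_cm, serum_creatinine_mg_dl):
    """Bedside Schwartz pediatric eGFR in mL/min/1.73 m^2:
    eGFR = 0.413 * height (cm) / serum creatinine (mg/dL)."""
    return 0.413 * height_cm / serum_creatinine_mg_dl
```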
VerdictDB: Universalizing Approximate Query Processing
Despite 25 years of research in academia, approximate query processing (AQP) has had little industrial adoption. One of the major causes of this slow adoption is the reluctance of traditional vendors to make radical changes to their legacy codebases, and the preoccupation of newer vendors (e.g., SQL-on-Hadoop products) with implementing standard features. Additionally, the few AQP engines that are available are each tied to a specific platform and require users to completely abandon their existing databases---an unrealistic expectation given the infancy of AQP technology. Therefore, we argue that a universal solution is needed: a database-agnostic approximation engine that will widen the reach of this emerging technology across various platforms. Our proposal, called VerdictDB, uses a middleware architecture that requires no changes to the backend database, and thus can work with all off-the-shelf engines. Operating at the driver level, VerdictDB intercepts analytical queries issued to the database and rewrites them into another query that, if executed by any standard relational engine, will yield sufficient information for computing an approximate answer. VerdictDB uses the returned result set to compute an approximate answer and error estimates, which are then passed on to the user or application. However, lack of access to the query execution layer introduces significant challenges in terms of generality, correctness, and efficiency. This paper shows how VerdictDB overcomes these challenges and delivers up to 171× speedup (18.45× on average) for a variety of existing engines, such as Impala, Spark SQL, and Amazon Redshift, while incurring less than 2.6% relative error. VerdictDB is open-sourced under the Apache License.
Comment: Extended technical report of the paper that appeared in Proceedings of the 2018 International Conference on Management of Data, pp. 1461-1476. ACM, 201
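The scale-up arithmetic behind sample-based AQP is straightforward; a minimal sketch of a Bernoulli-sampled SUM with a plug-in standard error (this illustrates the statistics such middleware relies on, not VerdictDB's actual rewriting rules or estimators):

```python
import math
import random

def approx_sum(values, p, seed=0):
    """Approximate SUM(values) from a Bernoulli(p) sample.
    Returns (estimate, standard_error). Under Bernoulli sampling,
    Var(estimate) = sum_i v_i^2 * (1 - p) / p; plugging in the sampled
    values (each representing 1/p rows) gives the estimate below."""
    rng = random.Random(seed)
    sample = [v for v in values if rng.random() < p]
    estimate = sum(sample) / p           # scale the sampled sum by 1/p
    se = math.sqrt((1.0 - p) / p**2 * sum(v * v for v in sample))
    return estimate, se
```

Group-wise answers, joins, and stratified samples complicate this considerably, which is where the generality and correctness challenges addressed in the paper arise.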
Energy Scenarios for South Eastern Europe: A close look into the Western Balkans
"The Energy Scenarios for South East Europe" thematic seminar took place on the 15th of December 2015 in Vienna, Austria. The workshop was organized by the Institute of Energy and Transport of the European Commission's Joint Research Centre (JRC-IET), hosted by the Energy Community Secretariat (ECS), and sponsored by the Directorate-General for Neighbourhood and Enlargement Negotiations (DG-NEAR) in the framework of the Travel Accommodation and Conference facility for Western Balkans and Turkey, a programme of dissemination activities organised by the Commission in the EU or the beneficiary country in connection with the enlargement process and the pre-accession strategy. The aim of the workshop was to bring together representatives from think tanks, scientific institutes, academia, and the private sector with government officials, national statistical agencies, and local TSO representatives from the Western Balkan region to exchange views on potential energy technology deployment scenarios that could facilitate a low-carbon development pathway for the enlargement countries, and also to exchange on the methodologies utilized and identify challenges as well as potential pitfalls in this process. The workshop included three sessions with a specific thematic focus. The first session provided the "regional picture", with forecasts on the development of the energy and power systems in the Western Balkans. The second session discussed case studies on low-carbon development trajectories for specific countries in the region, and the third session explored the role of particular technologies in this context. This report comprises long abstracts of the workshop presentations and closes with a chapter on conclusions and recommendations that resulted from the discussion sessions.
Inconsistency of the MLE for the joint distribution of interval censored survival times and continuous marks
This paper considers the nonparametric maximum likelihood estimator (MLE) for the joint distribution function of an interval censored survival time and a continuous mark variable. We provide a new explicit formula for the MLE in this problem. We use this formula and the mark specific cumulative hazard function of Huang and Louis (1998) to obtain the almost sure limit of the MLE. This result leads to necessary and sufficient conditions for consistency of the MLE, which imply that the MLE is inconsistent in general. We show that the inconsistency can be repaired by discretizing the marks. Our theoretical results are supported by simulations.
Comment: 27 pages, 4 figures